Hypothesis Set Stability and Generalization

Neural Information Processing Systems

We present a study of generalization for data-dependent hypothesis sets. We give a general learning guarantee for data-dependent hypothesis sets based on a notion of transductive Rademacher complexity. Our main result is a generalization bound for data-dependent hypothesis sets expressed in terms of a notion of hypothesis set stability and a notion of Rademacher complexity for data-dependent hypothesis sets that we introduce. This bound admits as special cases both standard Rademacher complexity bounds and algorithm-dependent uniform stability bounds. We also illustrate the use of these learning bounds in the analysis of several scenarios.
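For context, the two classical notions that the paper's bound admits as special cases can be stated in their standard, data-independent textbook forms (these are not the paper's data-dependent extensions):

```latex
% Empirical Rademacher complexity of a fixed hypothesis set H
% on a sample S = (x_1, ..., x_m), with i.i.d. uniform signs sigma_i in {-1,+1}:
\widehat{\mathfrak{R}}_S(H)
  = \mathbb{E}_{\sigma}\!\left[\sup_{h \in H}
      \frac{1}{m} \sum_{i=1}^{m} \sigma_i\, h(x_i)\right]

% Uniform stability of a learning algorithm A: for all samples S, S'
% differing in a single point and every test point z,
\bigl|\ell(A(S), z) - \ell(A(S'), z)\bigr| \le \beta
```

The paper's contribution, per the abstract, is a single bound that recovers guarantees of both kinds once the hypothesis set itself is allowed to depend on the data.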


Reviews: Hypothesis Set Stability and Generalization

Neural Information Processing Systems

A risk bound for data-dependent hypothesis classes is presented in terms of a notion of stability of the hypothesis class and a newly proposed extension of Rademacher complexity to data-dependent classes. The paper is clearly written and the results are interesting and mathematically sound. The unification of complexity-based and stability-based analyses for learning with data-dependent hypothesis sets seems a significant contribution. Their main theoretical result (Theorem 2) applies to a wide range of learning algorithms and is thus relevant to a large body of machine learning work. A nice analysis is presented for bagging, stochastic strongly convex optimization, and distillation.
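As a concrete illustration of the Rademacher-complexity side of the analysis (a minimal sketch of the classical, data-independent notion, not the paper's data-dependent variant), the empirical Rademacher complexity of a finite hypothesis set can be estimated by Monte Carlo over random sign vectors. The function name and inputs below are illustrative, not from the paper:

```python
import numpy as np

def empirical_rademacher(predictions, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of a
    finite hypothesis set on a fixed sample of size m.

    predictions: array of shape (num_hypotheses, m), where row k holds
                 hypothesis k's outputs h_k(x_1), ..., h_k(x_m) in [-1, 1].
    """
    rng = np.random.default_rng(seed)
    _, m = predictions.shape
    total = 0.0
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=m)   # random Rademacher signs
        corr = predictions @ sigma / m            # (1/m) * sum_i sigma_i h(x_i)
        total += corr.max()                       # sup over the hypothesis set
    return total / n_trials
```

A singleton hypothesis set has complexity near zero (the signed average has mean zero), while the pair {h, -h} yields roughly E[|Z|] for Z the mean of m random signs, i.e. about 0.08 at m = 100 — a quick sanity check on the estimator.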


Hypothesis Set Stability and Generalization

Foster, Dylan J., Greenberg, Spencer, Kale, Satyen, Luo, Haipeng, Mohri, Mehryar, Sridharan, Karthik

Neural Information Processing Systems
